Matrix Representations of Operators

In the previous sections, we introduced the state vector formalism of quantum mechanics, as well as some important results derived from it. In this section, we will explore how operators in quantum mechanics can be represented as matrices. Additionally, we will briefly discuss tensor products and entangled composite systems.

Operators as Matrices

Recall from the completeness relation that the identity operator can be written as a sum of projection operators:

$$I = \sum_i |a_i\rangle \langle a_i|$$

For an operator $A$, we can apply this relation twice to obtain:

$$A = IAI = \sum_i \sum_j |a_i\rangle \langle a_i|A|a_j\rangle \langle a_j|$$

If the ket space is $N$-dimensional, then the quantity in the middle, $\langle a_i|A|a_j\rangle$, has $N^2$ possible values. Furthermore, given all of these values, we can uniquely determine the operator $A$. As such, to a certain extent, $A$ can be represented by these values. We can put these values in a matrix, where the $i$-th row and $j$-th column contains the value $\langle a_i|A|a_j\rangle$:

$$A \doteq \begin{pmatrix} \langle a_1|A|a_1\rangle & \langle a_1|A|a_2\rangle & \cdots \\ \langle a_2|A|a_1\rangle & \langle a_2|A|a_2\rangle & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}$$

I, along with Sakurai, will use the notation $\doteq$ to mean that $A$ is represented by the matrix on the right-hand side.

An important property of these matrices is as follows: Take $\langle a_i|A^\dagger|a_j\rangle$ and move $A^\dagger$ to the left (taking its Hermitian adjoint) to get the inner product of $A|a_i\rangle$ with $|a_j\rangle$. Then, by the conjugate symmetry of the inner product, it is equal to $\langle a_j|A|a_i\rangle^*$:

$$\langle a_i|A^\dagger|a_j\rangle = \langle a_j|A|a_i\rangle^*$$

As such, the $(i, j)$-th element of the matrix representing $A^\dagger$ is the complex conjugate of the $(j, i)$-th element of the matrix representing $A$. In other words, the matrix of $A^\dagger$ is the conjugate transpose of the matrix of $A$.
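As a quick numerical illustration, here is a minimal plain-Python sketch (with an arbitrary made-up $2 \times 2$ matrix) of the conjugate-transpose relation:

```python
# Matrix of a hypothetical operator A in some orthonormal basis,
# stored as a list of rows.
A = [[1 + 2j, 3 - 1j],
     [0 + 4j, 5 + 0j]]

def dagger(M):
    """Conjugate transpose: the (i, j)-th element of M† is (M_ji)*."""
    n = len(M)
    return [[M[j][i].conjugate() for j in range(n)] for i in range(n)]

A_dag = dagger(A)

# The (i, j)-th element of A† is the conjugate of the (j, i)-th element of A.
assert A_dag[0][1] == (0 + 4j).conjugate()
assert A_dag[1][0] == (3 - 1j).conjugate()
```

Note also that applying `dagger` twice returns the original matrix, matching $(A^\dagger)^\dagger = A$.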

Matrix Multiplication

Given two operators $A$ and $B$, we can multiply them to get a new operator $C = AB$. We shall now show that the matrix representations of $A$, $B$, and $C$ follow the same rules as matrix multiplication.

Normally, for two matrices $\mathbf{A}$ and $\mathbf{B}$, the $(i, j)$-th element of the product is given by:

$$(\mathbf{A}\mathbf{B})_{ij} = \sum_k \mathbf{A}_{ik} \mathbf{B}_{kj}$$

Thus, we would want to show the following:

$$\langle a_i|C|a_j\rangle = \sum_k \langle a_i|A|a_k\rangle \langle a_k|B|a_j\rangle$$

This is much easier to prove than it seems. Since $C = AB$, we can write:

$$\langle a_i|C|a_j\rangle = \langle a_i|AB|a_j\rangle$$

Now, simply use the completeness relation between the two operators to insert a sum over $k$:

$$\langle a_i|AB|a_j\rangle = \sum_k \langle a_i|A|a_k\rangle \langle a_k|B|a_j\rangle$$

which is exactly what we needed to show.
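This rule can be checked numerically. The sketch below (plain Python, with arbitrary made-up matrices) computes the product element-by-element via the inserted sum:

```python
def matmul(A, B):
    """(AB)_ij = sum_k A_ik B_kj — the completeness relation inserted between A and B."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Two hypothetical 2x2 operator matrices.
A = [[1 + 1j, 2 + 0j],
     [0 + 0j, 1 - 1j]]
B = [[0 + 1j, 1 + 0j],
     [1 + 0j, 0 - 1j]]

C = matmul(A, B)
# Example element: C_00 = A_00 B_00 + A_01 B_10
assert C[0][0] == (1 + 1j) * (0 + 1j) + (2 + 0j) * (1 + 0j)
```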

Kets as Column Vectors

Kets can also be represented as column vectors.

Recall that kets can be expanded into linear combinations of basis kets:

$$|\psi\rangle = \sum_i |a_i\rangle \langle a_i|\psi\rangle$$

This is similar to the Euclidean case where a vector can be expanded as a linear combination of basis vectors:

$$\vec{v} = \sum_i v_i \hat{e}_i$$

In the Euclidean case, we say that $v_i$ is the $i$-th component of $\vec{v}$. Likewise, we can say that $\langle a_i|\psi\rangle$ is the $i$-th component of $|\psi\rangle$. Then, we can represent $|\psi\rangle$ as a column vector:

$$|\psi\rangle \doteq \begin{pmatrix} \langle a_1|\psi\rangle \\ \langle a_2|\psi\rangle \\ \vdots \\ \langle a_N|\psi\rangle \end{pmatrix}$$

It is often very beneficial to include the basis states in a row vector like this:

$$|\psi\rangle = \begin{pmatrix} |a_1\rangle & |a_2\rangle & \cdots & |a_N\rangle \end{pmatrix} \begin{pmatrix} \langle a_1|\psi\rangle \\ \langle a_2|\psi\rangle \\ \vdots \\ \langle a_N|\psi\rangle \end{pmatrix}$$

(Notice that we no longer use the $\doteq$ notation here, as this is not a representation of a ket - this is the ket itself.)

Next, suppose we apply an operator $A$ to $|\psi\rangle$ to get a new ket $|\phi\rangle = A|\psi\rangle$. By the rules of matrix multiplication, we would expect the $i$-th component of $|\phi\rangle$ to be:

$$\langle a_i|\phi\rangle = \sum_j \langle a_i|A|a_j\rangle \langle a_j|\psi\rangle$$

This is quite similar to the matrix multiplication rule we derived earlier. Simply write $\langle a_i|\phi\rangle = \langle a_i|A|\psi\rangle$ and apply the completeness relation again.
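As a small sketch (plain Python, hypothetical components), applying an operator to a ket is just a matrix-vector product on the components:

```python
# Components <a_i|psi> of a hypothetical ket in a 2-dimensional basis.
psi = [1 + 0j, 2 - 1j]

# Matrix elements <a_i|A|a_j> of a hypothetical operator.
A = [[1 + 0j, 2 + 0j],
     [0 + 1j, 1 + 0j]]

# <a_i|phi> = sum_j <a_i|A|a_j> <a_j|psi>
phi = [sum(A[i][j] * psi[j] for j in range(2)) for i in range(2)]

assert phi == [5 - 2j, 2 + 0j]
```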

Bra Vectors as Row Vectors

Bra vectors can be represented as row vectors. To find out how, suppose we apply an operator $A$ to a bra $\langle\psi|$ to get a new bra $\langle\phi| = \langle\psi|A$. The inner product of $\langle\phi|$ with any base ket $|a_j\rangle$ is, by the completeness relation:

$$\langle\phi|a_j\rangle = \langle\psi|A|a_j\rangle = \sum_i \langle\psi|a_i\rangle \langle a_i|A|a_j\rangle$$

The term $\langle a_i|A|a_j\rangle$ is the $(i, j)$-th element of the matrix representation of $A$. Thus, the other part, $\langle\psi|a_i\rangle$, should be the $i$-th component of $\langle\psi|$. This suggests that $\langle\psi|$ can be represented as a row vector:

$$\langle\psi| \doteq \begin{pmatrix} \langle\psi|a_1\rangle & \langle\psi|a_2\rangle & \cdots & \langle\psi|a_N\rangle \end{pmatrix}$$

Similarly to kets, a complete picture of a bra can be obtained with a product of a row vector and a column vector:

$$\langle\psi| = \begin{pmatrix} \langle\psi|a_1\rangle & \langle\psi|a_2\rangle & \cdots & \langle\psi|a_N\rangle \end{pmatrix} \begin{pmatrix} \langle a_1| \\ \langle a_2| \\ \vdots \\ \langle a_N| \end{pmatrix}$$

where $\langle a_i|$ is the $i$-th basis bra, defined such that $\langle a_i|a_j\rangle = \delta_{ij}$. They are also the Hermitian conjugates of the basis kets (see this section for more details and proofs). Notice that this time, the components are placed in the row matrix, instead of the column matrix like kets. This will be important when we discuss change of basis.

Since the inner product is conjugate symmetric, we can equivalently write the components of $\langle\psi|$ as:

$$\langle\psi|a_i\rangle = \langle a_i|\psi\rangle^*$$

The inner product of $\langle\psi|$ with a ket $|\phi\rangle$ is then, by the rules of matrix multiplication:

$$\langle\psi|\phi\rangle = \begin{pmatrix} \langle\psi|a_1\rangle & \langle\psi|a_2\rangle & \cdots \end{pmatrix} \begin{pmatrix} \langle a_1|\phi\rangle \\ \langle a_2|\phi\rangle \\ \vdots \end{pmatrix} = \sum_i \langle\psi|a_i\rangle \langle a_i|\phi\rangle$$

which aligns with simply adding a completeness relation to the original expression.

A more complete way to write the inner product comes from writing both the bra and the ket with their bases:

$$\langle\psi|\phi\rangle = \begin{pmatrix} \langle\psi|a_1\rangle & \cdots & \langle\psi|a_N\rangle \end{pmatrix} \begin{pmatrix} \langle a_1| \\ \vdots \\ \langle a_N| \end{pmatrix} \begin{pmatrix} |a_1\rangle & \cdots & |a_N\rangle \end{pmatrix} \begin{pmatrix} \langle a_1|\phi\rangle \\ \vdots \\ \langle a_N|\phi\rangle \end{pmatrix}$$

Since $\langle a_i|a_j\rangle = \delta_{ij}$, the two middle factors multiply to the identity matrix, and we recover $\langle\psi|\phi\rangle = \sum_i \langle\psi|a_i\rangle \langle a_i|\phi\rangle$ as before.
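A minimal numerical sketch of the inner-product formula (plain Python, made-up components): the bra's components are the conjugates of the ket's, and the inner product sums the componentwise products.

```python
# Components <a_i|psi> and <a_i|phi> of two hypothetical kets.
psi = [1 + 1j, 2 + 0j]
phi = [0 + 1j, 1 - 1j]

# <psi|a_i> = <a_i|psi>*, so:
# <psi|phi> = sum_i <psi|a_i><a_i|phi> = sum_i <a_i|psi>* <a_i|phi>
inner = sum(p.conjugate() * q for p, q in zip(psi, phi))

# <psi|psi> is real and non-negative (the squared norm):
norm_sq = sum(p.conjugate() * p for p in psi)
assert norm_sq.imag == 0 and norm_sq.real > 0
```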

Outer Products

An outer product is a product of a ket and a bra in an order such that the ket is on the left. We have previously shown that outer products are not scalars (like inner products), but rather operators.

To build intuition, we borrow from Euclidean vectors. Suppose we take the product of a column vector $\vec{u}$ (a ket) and a row vector $\vec{v}^T$ (a bra). Below I use a trick to perform this multiplication outlined in the appendix:

$$\vec{u}\,\vec{v}^T = \begin{pmatrix} u_1 \\ u_2 \end{pmatrix} \begin{pmatrix} v_1 & v_2 \end{pmatrix} = \begin{pmatrix} u_1 v_1 & u_1 v_2 \\ u_2 v_1 & u_2 v_2 \end{pmatrix}$$

So indeed, outer products are operators (matrices) that act on vectors.

In quantum mechanics, then, suppose we want to take the outer product $|\psi\rangle\langle\phi|$. By matrix multiplication:

$$|\psi\rangle\langle\phi| \doteq \begin{pmatrix} \langle a_1|\psi\rangle \\ \langle a_2|\psi\rangle \\ \vdots \end{pmatrix} \begin{pmatrix} \langle\phi|a_1\rangle & \langle\phi|a_2\rangle & \cdots \end{pmatrix} = \begin{pmatrix} \langle a_1|\psi\rangle\langle\phi|a_1\rangle & \langle a_1|\psi\rangle\langle\phi|a_2\rangle & \cdots \\ \langle a_2|\psi\rangle\langle\phi|a_1\rangle & \langle a_2|\psi\rangle\langle\phi|a_2\rangle & \cdots \\ \vdots & \vdots & \ddots \end{pmatrix}$$
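A quick sketch of the outer product as a column-times-row multiplication (plain Python, hypothetical components):

```python
# Components of hypothetical kets |psi> and |phi>.
psi = [1 + 0j, 0 + 1j]
phi = [2 + 0j, 1 - 1j]

# (|psi><phi|)_ij = <a_i|psi> <phi|a_j> = <a_i|psi> <a_j|phi>*
outer = [[psi[i] * phi[j].conjugate() for j in range(2)] for i in range(2)]

assert outer[0][1] == (1 + 0j) * (1 + 1j)  # <a_1|psi><phi|a_2>
```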

There is actually something deeper going on here. The outer product is a matrix that acts on vectors. That begs the question: is there a way to take the product of more complicated things to get another operator? For instance, perhaps we could take two matrices and somehow combine them to get a new operator. This action is actually the tensor product, denoted by $\otimes$. Sometimes, the outer product is written as $|\psi\rangle \otimes \langle\phi|$.

Short Note on Tensor Products

Recall that the state space of a coupled system is the tensor product of the individual state spaces. Virtually any tensor we usually use can be created by taking tensor products of vectors and linear forms (also called bra vectors, 1-forms, or covectors). In fact, this is one way to define a tensor - a combination of vectors and linear forms joined by the tensor product.

Most operators cannot be represented as a single outer product - outer products only produce rank-one (projection-like) operators. For a more concrete explanation, consider the matrix multiplication that forms the outer product - notice that all of the columns are scaled versions of the input column vector. This means that they are linearly dependent, and so the matrix is singular (i.e. it has a determinant of zero). This is why a general operator must be represented as a linear combination of outer products, rather than just a single outer product.
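This singularity is easy to verify numerically. The sketch below (plain Python, a $2 \times 2$ case with made-up components) confirms that the columns of an outer product are proportional and its determinant vanishes:

```python
psi = [1 + 2j, 3 - 1j]
phi = [2 + 1j, 1 + 1j]

# M = |psi><phi| : M_ij = psi_i * conj(phi_j)
M = [[psi[i] * phi[j].conjugate() for j in range(2)] for i in range(2)]

# Both columns are scaled copies of psi, so det M = 0.
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
assert det == 0
```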

Similarly, some coupled systems cannot be represented as the tensor product of individual systems; these are called entangled systems. Entanglement is a crucial concept in quantum mechanics, and it is a key feature that distinguishes quantum mechanics from classical mechanics. We will explore the physical implications of entanglement in more detail in future sections.

Exercises

  1. Suppose we have a system where the state space is two-dimensional with basis states $|0\rangle$ and $|1\rangle$. (Think of spin-1/2 particles or qubits used in quantum computing.)

    • Write down the matrix representation of the operator $X$ that flips the state of the system. That is, $X|0\rangle = |1\rangle$ and $X|1\rangle = |0\rangle$.

      Solution

      Before writing the components, consider the dimensions of the matrix. Since the state space is two-dimensional, the matrix will be $2 \times 2$.

      The $(i, j)$-th element of the matrix is given by $\langle i|X|j\rangle$, where $i, j \in \{0, 1\}$. In this case, we have:

      $$X \doteq \begin{pmatrix} \langle 0|X|0\rangle & \langle 0|X|1\rangle \\ \langle 1|X|0\rangle & \langle 1|X|1\rangle \end{pmatrix}$$

      Now, we can fill in the components. Since $X|0\rangle = |1\rangle$ and $X|1\rangle = |0\rangle$, we have:

      $$X \doteq \begin{pmatrix} \langle 0|1\rangle & \langle 0|0\rangle \\ \langle 1|1\rangle & \langle 1|0\rangle \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}$$

      This matrix is known as the Pauli-X matrix. We will see more of these matrices in the future, and analyze them in detail when we discuss angular momentum.

    • Suppose there are now two systems coupled together. Show that the Bell state $|\Phi^+\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)$ is an entangled state. For this problem, you can assume that tensor products act like normal products, i.e. $(a|u\rangle) \otimes (b|v\rangle) = ab\,(|u\rangle \otimes |v\rangle)$, and that they distribute over addition.

      Solution

      We can show this by a proof by contradiction.

      Suppose that $|\Phi^+\rangle$ is not entangled. Then, we can write it as a tensor product of two states $|\psi_1\rangle$ and $|\psi_2\rangle$:

      $$|\Phi^+\rangle = |\psi_1\rangle \otimes |\psi_2\rangle$$

      Now, we can expand $|\psi_1\rangle$ and $|\psi_2\rangle$ in terms of the basis states $|0\rangle$ and $|1\rangle$:

      $$|\psi_1\rangle = a|0\rangle + b|1\rangle, \qquad |\psi_2\rangle = c|0\rangle + d|1\rangle$$

      Then, the tensor product is:

      $$|\psi_1\rangle \otimes |\psi_2\rangle = (a|0\rangle + b|1\rangle) \otimes (c|0\rangle + d|1\rangle)$$

      Expanding this out, we get:

      $$|\psi_1\rangle \otimes |\psi_2\rangle = ac|00\rangle + ad|01\rangle + bc|10\rangle + bd|11\rangle$$

      (where $|00\rangle$ is shorthand for $|0\rangle \otimes |0\rangle$, and similarly for the others).

      Now, compare this to the Bell state $|\Phi^+\rangle = \frac{1}{\sqrt{2}}\left(|00\rangle + |11\rangle\right)$. We need $ac = bd = \frac{1}{\sqrt{2}}$ and $ad = bc = 0$.

      Multiplying the first two equations, we get $(ac)(bd) = abcd = \frac{1}{2}$. But this means that none of $a$, $b$, $c$, or $d$ can be zero, which contradicts $ad = bc = 0$.

      Thus, because we cannot find a way to write $|\Phi^+\rangle$ as a tensor product of two states, it is an entangled state.
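Both results can be cross-checked numerically. The sketch below (plain Python) verifies that the Pauli-X matrix flips the basis states; for the Bell state it uses a fact not needed in the proof above - a two-qubit state is a product state exactly when its $2 \times 2$ coefficient matrix has zero determinant - to confirm entanglement:

```python
from math import isclose, sqrt

# Pauli-X matrix and basis states as component lists.
X = [[0, 1],
     [1, 0]]
ket0 = [1, 0]
ket1 = [0, 1]

def apply(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

assert apply(X, ket0) == ket1
assert apply(X, ket1) == ket0

# Coefficient matrix of the Bell state: coeff[i][j] multiplies |i> ⊗ |j>.
coeff = [[1 / sqrt(2), 0],
         [0, 1 / sqrt(2)]]

# For a product state (a|0>+b|1>) ⊗ (c|0>+d|1>), the coefficient matrix is
# [[ac, ad], [bc, bd]], whose determinant ac*bd - ad*bc is always zero.
det = coeff[0][0] * coeff[1][1] - coeff[0][1] * coeff[1][0]
assert isclose(det, 0.5)  # nonzero, so the Bell state is entangled
```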

Summary and Next Steps

In this rather brief section, we have shown how the various objects in quantum mechanics can be represented as matrices. This is a crucial step in understanding how quantum mechanics can be formulated in terms of linear algebra.

Here are the key points to remember:

  • Operators in quantum mechanics can be represented as matrices. Specifically, the matrix elements of an operator $A$ are given by $A_{ij} = \langle a_i|A|a_j\rangle$.
  • Kets can be represented as column vectors, and bras as row vectors.
  • Outer products are operators that act on vectors. They can be thought of as the tensor product of a ket and a bra.
  • If a system cannot be represented as the tensor product of individual systems, it is an entangled system.

In the next section, we continue our exploration of vectors and operators by looking at the change of basis.

References

  • J.J. Sakurai, "Modern Quantum Mechanics", section 1.3.

Appendix: Quick Trick for Matrix Multiplication

A quick trick to perform matrix multiplication is as follows: Given the product $\mathbf{A}\mathbf{B}$ of two matrices, shift $\mathbf{B}$ up and to the right, and then sum the products of the corresponding elements in the two matrices. To see what I mean, consider the following matrices:

$$\mathbf{A} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}, \qquad \mathbf{B} = \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{pmatrix}$$

To multiply them, we can shift $\mathbf{B}$ up and to the right:

$$\begin{array}{cc} & \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{pmatrix} \\ \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} & \begin{pmatrix} (\mathbf{A}\mathbf{B})_{11} & (\mathbf{A}\mathbf{B})_{12} \\ (\mathbf{A}\mathbf{B})_{21} & (\mathbf{A}\mathbf{B})_{22} \end{pmatrix} \end{array}$$

Then, the elements in the product matrix are given by the sum of the products of the corresponding elements: the entry at the intersection of row $i$ of $\mathbf{A}$ and column $j$ of $\mathbf{B}$ pairs those elements off and sums the products. For example, the element in the first row and first column is:

$$(\mathbf{A}\mathbf{B})_{11} = a_{11} b_{11} + a_{12} b_{21}$$

This also easily allows one to find out whether the dimensions of the matrices are compatible for multiplication. For example, the following matrices cannot be multiplied:

$$\mathbf{A} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}, \qquad \mathbf{B} = \begin{pmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \\ b_{31} & b_{32} \end{pmatrix}$$

The $b_{3j}$'s would require something from a third column in the first matrix, which is not present.